Project 2 - Model Calibration Project - Using Neural Network to Calibrate Heston Volatility Model¶
Objective:¶
To predict “tomorrow’s price” of European call options using Heston model parameters estimated by a trained neural network.
Things to know:¶
- The Heston model assumes that the underlying stock price follows a stochastic process and that its variance follows a Cox, Ingersoll, and Ross (CIR) process. Hence, under the historical / physical probability measure $P$, the Heston model is represented by the following bivariate system of Stochastic Differential Equations (SDEs): $$ dS_t = \mu S_t \, dt + \sqrt{v_t} \, S_t \, dW_{1,t} $$
$$ dv_t = \kappa (\theta - v_t) \, dt + \sigma \sqrt{v_t} \, dW_{2,t} $$
$$ where, \ \ \mathbb{E}^P[dW_{1,t} \, dW_{2,t}] = \rho \, dt $$
- For pricing purposes, the same dynamics need to be determined under the risk-neutral probability measure $Q$. This is achieved by applying Girsanov’s theorem. The system of SDEs under the risk-neutral probability measure is then given by: $$ dS_t = r S_t \, dt + \sqrt{v_t} \, S_t \, d\widetilde{W}_{1,t} $$
$$ dv_t = \kappa^* (\theta^* - v_t) \, dt + \sigma \sqrt{v_t} \, d\widetilde{W}_{2,t} $$
$$ where, \ \ \mathbb{E}^Q[d\widetilde{W}_{1,t} \, d\widetilde{W}_{2,t}] = \rho \, dt $$
$$ and \ \ \kappa^* = \kappa + \lambda, \quad \theta^* = \frac{\kappa \theta}{\kappa + \lambda} $$
This gives the following parameters to be calibrated:
- $\kappa$ : the mean reversion speed for the variance
- $\theta$ : the mean reversion level for the variance
- $\sigma$ : the volatility of the variance
- $v_0$ : the initial (time zero) level of the variance
- $\rho$ : the correlation between the two Brownian motions
- $\lambda$ : the volatility risk premium
When $\lambda = 0$, $\kappa^* = \kappa$ and $\theta^* = \theta$, so the parameters coincide under the physical and risk-neutral measures. Following the guidance in the book "The Heston Model and Its Extensions in Matlab and C#" by Fabrice D. Rouah, $\lambda$ is set to $0$ for the purpose of this project. This leaves the other five parameters to be calibrated.
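The mapping between physical and risk-neutral parameters can be written directly from the formulas above. A minimal sketch (the function name `risk_neutral_params` is illustrative, not part of the project code):

```python
# Sketch of the physical-to-risk-neutral parameter mapping
# (function name is illustrative, not part of the project code)
def risk_neutral_params(kappa, theta, lamda):
    """Return (kappa*, theta*) given the physical-measure parameters."""
    kappa_star = kappa + lamda
    theta_star = kappa * theta / (kappa + lamda)
    return kappa_star, theta_star

# With lamda = 0, both parameters are unchanged (kappa* = kappa, theta* = theta)
print(risk_neutral_params(1.5768, 0.0398, 0.0))
# A nonzero volatility risk premium shifts both parameters
print(risk_neutral_params(1.5768, 0.0398, 0.5))
```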
The Strategy:¶
The methodology to recover these five parameters (i.e. $\kappa$, $\theta$, $\sigma$, $v_0$, $\rho$) and use them to predict “tomorrow’s price” can be broken down into three main steps:
Generating synthetic market data : Since the Heston model parameters are not directly observable in financial market data, the first step is to generate data that mimics market behavior.
Training the neural network : This is done by defining an input layer (i.e. features that are readily available in real-world financial market data) and an output layer (i.e. the 5 parameters to be recovered). The network is then fed the synthetic market data for training so that it learns the relationship between a given set of features and the corresponding parameter values.
Calibrating the parameters : The calibration of the Heston model parameters will be done in two ways to study the numerical results:
- Calibration with testing data
- Calibration with real market data
Part 1 - Generating Synthetic Market Data¶
- Since the Heston model parameters are not directly observable in financial market data, the first step is to generate data that mimics market behavior.
- Care must be taken while defining the sampling range for each variable. To generate data that is as realistic as possible, reference has been drawn from current market data and existing research papers. One such guiding paper, which uses Latin Hypercube Sampling (LHS), is "A neural network-based framework for financial model calibration" by Liu, S., Borovykh, A., Grzelak, L.A. et al.
- With the sampled variables in hand, the Heston model price can be calculated to complete each sample set.
1.1 Defining the class for calculating Heston price¶
The call option price under the Heston model can be determined by the method of characteristic functions. The solution has a form analogous to the Black-Scholes formula: $$ C(K) = S_t P_1 - K e^{-r \tau} P_2 $$
Here, $P_1$ is the delta of the European call option and $P_2$ is the conditional risk-neutral probability that $S_T > K$.
For determining the call option price and simplifying the solution to a single integral, guidance has been sought from Fabrice D. Rouah's book. This can be conveniently implemented by defining a class that computes the price under the Heston model.
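For comparison, the Black-Scholes call price has exactly this structure, $C = S_0 N(d_1) - K e^{-r\tau} N(d_2)$, with $N(d_1)$ and $N(d_2)$ playing the roles of $P_1$ and $P_2$. A minimal, self-contained sketch (not part of the Heston class, included only as a structural reference):

```python
import math

# Black-Scholes call price: the closed form whose structure the Heston
# solution C = S*P1 - K*exp(-r*tau)*P2 mirrors (reference sketch only)
def bs_call(S0, K, tau, r, vol):
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    d1 = (math.log(S0 / K) + (r + 0.5 * vol**2) * tau) / (vol * math.sqrt(tau))
    d2 = d1 - vol * math.sqrt(tau)
    return S0 * N(d1) - K * math.exp(-r * tau) * N(d2)

print(bs_call(100, 100, 1.0, 0.03, 0.2))  # about 9.41
```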
Python Libraries: numpy, math, cmath, matplotlib. If any of these libraries is not installed, run `pip install library_name` in a terminal or command prompt.
- numpy: The fundamental package for scientific computing. It is useful for numerical operations.
- math: Provides access to mathematical functions for real numbers (e.g., trigonometry, logarithms, constants).
- cmath: Similar to math, but supports complex numbers and operations in the complex plane.
- matplotlib.pyplot: The standard library for creating static, animated, and interactive visualizations in Python.
#Importing the libraries
import numpy as np
import math
import cmath
import matplotlib.pyplot as plt
#Defining the class to hold the relevant functions
class Heston(object):
    #Constructor: store the model inputs and the derived constants
    def __init__(self,S0,K,tau,r,kappa,theta,v0,lamda,sigma,rho):
        self.x0=math.log(S0)
        self.ln_k=math.log(K)
        self.r=r
        self.v0=v0
        self.kappa=kappa
        self.theta=theta
        self.lamda=lamda
        self.sigma=sigma
        self.rho=rho
        self.tau=tau
        self.a=kappa*theta
        self.u=[0.5,-0.5]
        self.b=[kappa+lamda-rho*sigma,kappa+lamda]
    #Function for resetting the constant parameters (reuses the constructor logic)
    def reset_parameters(self,S0,K,tau,r,kappa,theta,v0,lamda,sigma,rho):
        self.__init__(S0,K,tau,r,kappa,theta,v0,lamda,sigma,rho)
    #Helper computing the j-th characteristic function (j=0 or j=1); each value has a real and an imaginary part
    def _f(self,j,phi):
        temp=self.b[j]-1j*self.rho*self.sigma*phi
        d=cmath.sqrt(temp**2-self.sigma**2*(2.0*self.u[j]*phi*1j-phi**2))
        g=(temp+d)/(temp-d)
        edt=cmath.exp(d*self.tau)
        gedt=1.0-g*edt
        D=(temp+d)*(1.0-edt)/gedt/self.sigma/self.sigma
        C=self.r*phi*self.tau*1j+self.a/self.sigma/self.sigma*((temp+d)*self.tau-2.0*cmath.log(gedt/(1.0-g)))
        return cmath.exp(C+D*self.v0+1j*phi*self.x0)
    #Return both characteristic functions f1 and f2 as a list
    def characteristic_func(self,phi):
        return [self._f(0,phi),self._f(1,phi)]
    #f1 is the j=0 characteristic function
    def f1(self,phi):
        return self._f(0,phi)
    #f2 is the j=1 characteristic function
    def f2(self,phi):
        return self._f(1,phi)
    #Returns the integrand that appears in the P1 formula
    def P1_integrand(self,phi):
        temp=cmath.exp(-1j*phi*self.ln_k)*self.f1(phi)/1j/phi
        return temp.real
    #Returns the integrand that appears in the P2 formula
    def P2_integrand(self,phi):
        temp=cmath.exp(-1j*phi*self.ln_k)*self.f2(phi)/1j/phi
        return temp.real
    #Compute the two probabilities: a and b are the integration limits, n is the number of intervals.
    #The interval from just above 0 up to about 100 usually captures the mass that matters, so b need not go to infinity.
    def Probabilities(self,a,b,n):
        pi_i=1.0/math.pi
        P1=0.5+pi_i*trapzd(self.P1_integrand,a,b,n) #trapzd function is defined later
        P2=0.5+pi_i*trapzd(self.P2_integrand,a,b,n)
        return [P1,P2]
    #Heston call price built from the two probabilities
    def price(self,a,b,n):
        Ps=self.Probabilities(a,b,n)
        call_price=math.exp(self.x0)*Ps[0]-math.exp(self.ln_k-self.r*self.tau)*Ps[1]
        return call_price
    #Plot real and imaginary parts of the characteristic functions (f1 and f2), and the integrands that appear in P1 and P2
    def plot_f1f2(self):
        n=2000
        lwr=-50.111  #slightly irregular limits keep phi=0 off the grid
        upr=50.0311
        x=np.linspace(lwr,upr,n+1)
        fs=[self.characteristic_func(x[i]) for i in range(n+1)]
        y1=[fs[i][0].real for i in range(n+1)]
        y2=[fs[i][0].imag for i in range(n+1)]
        y3=[self.P1_integrand(x[i]) for i in range(n+1)]
        y4=[fs[i][1].real for i in range(n+1)]
        y5=[fs[i][1].imag for i in range(n+1)]
        y6=[self.P2_integrand(x[i]) for i in range(n+1)]
        fig=plt.figure()
        f1_real=fig.add_subplot(231)
        f1_real.set_title('Real part of F1')
        f1_real.plot(x,y1)
        f1_imag=fig.add_subplot(232)
        f1_imag.set_title('Imaginary part of F1')
        f1_imag.plot(x,y2)
        f1_integrand=fig.add_subplot(233)
        f1_integrand.set_title('Integrand of P1')
        f1_integrand.plot(x,y3)
        f2_real=fig.add_subplot(234)
        f2_real.set_title('Real part of F2')
        f2_real.plot(x,y4)
        f2_imag=fig.add_subplot(235)
        f2_imag.set_title('Imaginary part of F2')
        f2_imag.plot(x,y5)
        f2_integrand=fig.add_subplot(236)
        f2_integrand.set_title('Integrand of P2')
        f2_integrand.plot(x,y6)
        plt.show()
#end class
#Defining the trapezoid method for numerical integration; one could also use a function from the scipy.integrate library
def trapzd(func,a,b,n):
    if n<1:
        return None
    elif n==1:
        return 0.5*(b-a)*(func(a)+func(b))
    else:
        dx=(b-a)/n
        x=np.linspace(a,b,n+1)
        y=np.array([func(x[i]) for i in range(n+1)])
        #dx/2 times the elementwise sums (y[i]+y[i+1]) over all sub-intervals, i.e. the composite trapezoid rule
        return 0.5*dx*np.sum(y[1:]+y[:-1])
# Example usage
hc=Heston(S0=100,K=100,tau=1.0,r=0.03,kappa=1.5768,theta=0.0398,v0=0.1,lamda=0,sigma=0.3,rho=-0.5711)
call_price=hc.price(0.00001,100,10000)  #lower limit slightly above 0 to avoid the phi=0 singularity
print("The call price of the European option is:", call_price)
The call price of the European option is: 11.671852351675497
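The `trapzd` routine used inside `Probabilities` is the standard composite trapezoid rule. As a stand-alone sanity check (re-implemented here so the snippet is self-contained), it should recover a known integral:

```python
import numpy as np

# Composite trapezoid rule, mirroring the trapzd helper used by the Heston class
def trap(func, a, b, n):
    x = np.linspace(a, b, n + 1)
    y = np.array([func(xi) for xi in x])
    dx = (b - a) / n
    # dx/2 times the sum of (y[i] + y[i+1]) over all sub-intervals
    return 0.5 * dx * np.sum(y[1:] + y[:-1])

# Integral of sin(x) from 0 to pi is exactly 2
print(trap(np.sin, 0.0, np.pi, 1000))  # close to 2.0
```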
1.2 Generating samples and splitting them into training & testing datasets¶
For the purpose of generating samples, the Latin Hypercube Sampling (LHS) method has been employed. LHS is a stratified sampling method for generating a near-random sample of parameter values from a multidimensional distribution. One algorithm for generating the samples is to generate D random permutations of the integers 0 through N-1, where D is the number of dimensions and N is the desired number of samples. Stacking these permutations as the columns of an N×D matrix gives a list of N coordinates which form a Latin hypercube.
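The permutation-based algorithm described above can be sketched as follows (a minimal illustration; the project itself uses the `lhs` function from the `pyDOE` library):

```python
import numpy as np

# Minimal permutation-based Latin Hypercube Sampling sketch:
# each of the D columns places exactly one point in each of the N strata
def lhs_sketch(n_samples, n_dims, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        perm = rng.permutation(n_samples)   # one stratum index per sample
        jitter = rng.random(n_samples)      # random position inside each stratum
        samples[:, d] = (perm + jitter) / n_samples
    return samples

pts = lhs_sketch(10, 2)
# Each column hits each of the 10 strata [0, 0.1), [0.1, 0.2), ... exactly once
print(np.sort(np.floor(pts[:, 0] * 10).astype(int)))
```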
In the present case, LHS has been employed to generate 500,000 samples for each of the 9 variables of the Heston model (4 market-observable inputs, i.e. $S_0$, $K$, $\tau$, $r$, and 5 parameters). The 10th variable, $\lambda$, is set to 0. Then, using the generated values, the Heston call option price is calculated with the Heston class from 1.1.
Each of the 500,000 samples is now a set that consists of individual values for the variables and respective Heston call option price.
The following table gives details on the sampling range chosen for each variable:

| Variables | Description | Range / Value |
|-----------|------------------------------|----------------------|
| $S_0$ | Spot price | (50, 550) |
| $K$ | Strike price | (5, 300) |
| $\tau$ | Time to maturity | (0.05, 3) |
| $r$ | Risk-free interest rate | (0.01, 0.05) |
| $v_0$ | Initial variance | (0.05, 0.5) |
| $\kappa$ | Mean reversion rate | (0, 3) |
| $\theta$ | Long-term variance | (0.01, 0.5) |
| $\sigma$ | Volatility of variance | (0.01, 0.8) |
| $\rho$ | Correlation coefficient | (-0.9, 0) |
| $\lambda$ | Variance risk premium | 0 |
Note : Effectively, the synthetic market data is generated such that each option price has known values for the market-observable inputs as well as the Heston model parameters. Since LHS generates a near-random sample, the resulting prices are free of noise, which does not reflect real market data. To make the training data more realistic, an element of noise was deliberately added to the call option price as follows: $$ \widetilde{C} = C + \epsilon $$
where,
$$ \epsilon \sim \mathcal{N}(0, \widetilde{\sigma}^2) \quad \text{and} \quad \widetilde{\sigma} = 0.01\% \text{ of } C $$
Here, the noise is a random number with mean 0 and standard deviation equal to 0.01% of the calculated option price. The standard deviation is chosen as a percentage of the price so that the amount of noise is commensurate with the magnitude of the price. For instance, an arbitrary absolute amount, say $1, would be a lot of noise if the price of the option is only $10. With the addition of noise, the synthetic market data can be used to train the neural network effectively.
Before training the neural network, the dataset is split into three subsets: training (80%), validation (10%), and testing (10%). This process is essential to ensure that the model is trained, validated, and tested on distinct data, allowing for a proper evaluation of its performance and generalization capability.
The following code-block generates the samples using LHS, calculates the Heston Price as per Part 1.1, adds the element of noise, splits the generated data into subsets and visualizes the distribution of calculated prices. The additional python libraries used for this purpose are:
- pyDOE: A library for Design of Experiments, including tools like Latin Hypercube Sampling (LHS).
- seaborn: A statistical data visualization library based on matplotlib, with built-in themes and complex plots.
- sklearn.model_selection: Provides tools for splitting data into training and testing sets, cross-validation, and hyperparameter tuning.
import numpy as np
from pyDOE import lhs
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
# Refined parameter ranges
param_ranges = {
'S0': (50, 550),
'K': (5, 300),
'tau': (0.05, 3),
'r': (0.01, 0.05),
'v0': (0.05, 0.5),
'kappa': (0, 3),
'theta': (0.01, 0.5),
'sigma': (0.01, 0.8),
'rho': (-0.9, 0)
}
# Number of samples
n_samples = 500000
# Generate Latin Hypercube Sampling
lhs_samples = lhs(len(param_ranges), samples=n_samples)
# Scale samples to the parameter ranges
params = {}
for i, key in enumerate(param_ranges):
    low, high = param_ranges[key]
    params[key] = lhs_samples[:, i] * (high - low) + low
# Combine parameters into a single array
param_sets = np.array([params[key] for key in param_ranges]).T
# Function to add noise to the price
def add_noise(price, scale=0.0001):
    if price < 0:
        price = 0  # Ensure price is not negative
    noise_std = scale * price
    noise = np.random.normal(0, noise_std)
    return price + noise
# Create a list to store option prices and parameter sets with added noise
price_param_list = []
# Calculate option prices for each set of parameters
for param_set in param_sets:
    S0, K, tau, r, v0, kappa, theta, sigma, rho = param_set
    lambd = 0  # Lambda is set to 0
    heston_model = Heston(S0, K, tau, r, kappa, theta, v0, lambd, sigma, rho)
    price = heston_model.price(0.00001, 100, 10000)
    # Add noise to the price
    noisy_price = add_noise(price)
    price_param_list.append((noisy_price, param_set))
# Prepare input and output datasets for the neural network
inputs = np.array([[price, S0, K, tau, r, lambd] for price, (S0, K, tau, r, v0, kappa, theta, sigma, rho) in price_param_list])
outputs = np.array([[kappa, theta, v0, sigma, rho] for _, (_, _, _, _, v0, kappa, theta, sigma, rho) in price_param_list])
# Split the dataset
X_train, X_temp, y_train, y_temp = train_test_split(inputs, outputs, test_size=0.2, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_temp, y_temp, test_size=0.5, random_state=42)
print(f"X_train shape: {X_train.shape}, y_train shape: {y_train.shape}")
print(f"X_val shape: {X_val.shape}, y_val shape: {y_val.shape}")
print(f"X_test shape: {X_test.shape}, y_test shape: {y_test.shape}")
# Visualization of the option prices
option_prices = [item[0] for item in price_param_list]
plt.figure(figsize=(12, 6))
sns.histplot(option_prices, bins=50, kde=True)
plt.xlabel('Option Price')
plt.ylabel('Frequency')
plt.title('Distribution of Heston Model Option Prices')
plt.show()
X_train shape: (400000, 6), y_train shape: (400000, 5)
X_val shape: (50000, 6), y_val shape: (50000, 5)
X_test shape: (50000, 6), y_test shape: (50000, 5)
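The two-stage split used above yields the 80/10/10 proportions because `test_size=0.2` first holds out 20% of the data, which the second call then halves. This can be checked on dummy stand-in arrays:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Dummy stand-ins for the real inputs/outputs, only to verify the proportions
X = np.zeros((1000, 6))
y = np.zeros((1000, 5))
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.2, random_state=42)
X_va, X_te, y_va, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=42)
print(len(X_tr), len(X_va), len(X_te))  # 800 100 100
```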
1.3 Saving the sampled data¶
- Since a large number of samples is generated, the computation takes a significant amount of time, and the samples would change if the program were rerun. Therefore, it is good practice to save the sampled data to separate files and use those files for the further steps of the project.
# Save the sampled data :
import numpy as np
# Save the input and output datasets
np.save('inputs.npy', inputs)
np.save('outputs.npy', outputs)
np.save('X_train.npy', X_train)
np.save('X_val.npy', X_val)
np.save('X_test.npy', X_test)
np.save('y_train.npy', y_train)
np.save('y_val.npy', y_val)
np.save('y_test.npy', y_test)
# Load the datasets
inputs = np.load('inputs.npy')
outputs = np.load('outputs.npy')
X_train = np.load('X_train.npy')
X_val = np.load('X_val.npy')
X_test = np.load('X_test.npy')
y_train = np.load('y_train.npy')
y_val = np.load('y_val.npy')
y_test = np.load('y_test.npy')
Part 2 - Training the Neural Network¶
The architecture of the neural network model is as follows:
- Input Layer: 6 features — Option price, $S_0$, $K$, $\tau$, $r$, and $\lambda$.
- Hidden Layers: 4 fully connected (Dense) layers, each with 200 neurons and the ReLU activation function (the first of these also receives the 6 inputs).
- Output Layer: 5 neurons corresponding to the predicted Heston model parameters — $\kappa$, $\theta$, $v_0$, $\sigma$, and $\rho$.
The model is compiled using the Adam optimizer, and the loss function is the mean squared error (MSE), defined as:
$$ \text{MSE} = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 $$
The model is trained for up to 1000 epochs with a batch size of 1024.
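The MSE loss above reduces to a simple average of squared errors; for instance (illustrative values only):

```python
import numpy as np

# Mean squared error exactly as in the formula above
def mse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

# Errors of 0, 0 and 2 give MSE = (0 + 0 + 4) / 3
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # 1.333...
```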
The neural network is implemented in Python using TensorFlow's Keras package. The library imports are explained as follows:
- tensorflow: An end-to-end open-source platform for machine learning and deep learning.
- tensorflow.keras.models.Sequential: A linear stack of layers used to build neural networks in Keras.
- tensorflow.keras.layers.Dense: A fully connected neural network layer where each neuron receives input from all neurons of the previous layer.
- tensorflow.keras.optimizers.Adam: An adaptive learning rate optimization algorithm widely used for training deep learning models.
- tensorflow.keras.callbacks.ReduceLROnPlateau: Callback that reduces the learning rate when a metric has stopped improving.
- tensorflow.keras.callbacks.EarlyStopping: Callback that stops training when a monitored metric has stopped improving to prevent overfitting.
The training and validation losses are also computed and plotted on a graph.
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping
import matplotlib.pyplot as plt
# Define the model for the forward pass
model = Sequential()
model.add(Dense(200, input_dim=6, activation='relu')) # 6 input features: Option price, S0, K, tau, r, lambd
for _ in range(3):
    model.add(Dense(200, activation='relu'))
model.add(Dense(5)) # Output 5 parameters: kappa, theta, v0, sigma, rho
# Compile the model
model.compile(optimizer=Adam(), loss='mean_squared_error')
# Define learning rate scheduler and early stopping
class CustomEarlyStopping(EarlyStopping):
    def on_epoch_end(self, epoch, logs=None):
        super().on_epoch_end(epoch, logs)
        if self.stopped_epoch > 0:
            print(f"Restoring model weights from the end of the best epoch: {self.best_epoch}")
lr_scheduler = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=500, min_lr=1e-6)
early_stopping = CustomEarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=500, restore_best_weights=True)
# Train the model
history = model.fit(
    X_train, y_train,
    epochs=1000,
    batch_size=1024,
    validation_data=(X_val, y_val),
    callbacks=[lr_scheduler, early_stopping],
    verbose=1
)
# Evaluate the model
train_mse = model.evaluate(X_train, y_train, verbose=0)
test_mse = model.evaluate(X_test, y_test, verbose=0)
print(f'Training MSE: {train_mse}')
print(f'Testing MSE: {test_mse}')
# Extract data from epoch 50 onward
start_epoch = 50
total_epochs = len(history.history['loss'])
epochs = range(start_epoch, total_epochs + 1)
train_loss = history.history['loss'][start_epoch - 1:]
val_loss = history.history['val_loss'][start_epoch - 1:]
# Plot the training loss vs validation loss starting from epoch 50
plt.figure(figsize=(10, 6))
plt.plot(epochs, train_loss, label='Training Loss')
plt.plot(epochs, val_loss, label='Validation Loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.title('Training and Validation Loss')
plt.legend()
plt.grid(True)
plt.show()
Epoch 1/1000 391/391 [==============================] - 2s 4ms/step - loss: 1.4074 - val_loss: 0.2174 - lr: 0.0010
Epoch 2/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.2119 - val_loss: 0.2110 - lr: 0.0010
Epoch 3/1000 391/391 [==============================] - 1s 4ms/step - loss: 0.2023 - val_loss: 0.1876 - lr: 0.0010
...
Epoch 135/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1776 - val_loss: 0.1768 - lr: 0.0010
Epoch 136/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010
(log truncated: the loss declines steadily and plateaus near 0.178, with val_loss near 0.177)
[==============================] - 2s 5ms/step - loss: 0.1777 - val_loss: 0.1768 - lr: 0.0010 Epoch 138/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1773 - lr: 0.0010 Epoch 139/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 140/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010 Epoch 141/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1767 - lr: 0.0010 Epoch 142/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 143/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1765 - lr: 0.0010 Epoch 144/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1768 - lr: 0.0010 Epoch 145/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1774 - val_loss: 0.1775 - lr: 0.0010 Epoch 146/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1767 - lr: 0.0010 Epoch 147/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010 Epoch 148/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1771 - lr: 0.0010 Epoch 149/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1769 - lr: 0.0010 Epoch 150/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1765 - lr: 0.0010 Epoch 151/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1770 - lr: 0.0010 Epoch 152/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1765 - lr: 0.0010 Epoch 153/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1774 - val_loss: 0.1769 - lr: 0.0010 Epoch 154/1000 391/391 
[==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1765 - lr: 0.0010 Epoch 155/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010 Epoch 156/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1776 - val_loss: 0.1777 - lr: 0.0010 Epoch 157/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1769 - lr: 0.0010 Epoch 158/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1774 - val_loss: 0.1765 - lr: 0.0010 Epoch 159/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1774 - val_loss: 0.1785 - lr: 0.0010 Epoch 160/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1763 - lr: 0.0010 Epoch 161/1000 391/391 [==============================] - 2s 4ms/step - loss: 0.1775 - val_loss: 0.1767 - lr: 0.0010 Epoch 162/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1771 - lr: 0.0010 Epoch 163/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1768 - lr: 0.0010 Epoch 164/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1764 - lr: 0.0010 Epoch 165/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1762 - lr: 0.0010 Epoch 166/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 167/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1770 - lr: 0.0010 Epoch 168/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1769 - lr: 0.0010 Epoch 169/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 170/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 171/1000 391/391 
[==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1767 - lr: 0.0010 Epoch 172/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1767 - lr: 0.0010 Epoch 173/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1763 - lr: 0.0010 Epoch 174/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1768 - lr: 0.0010 Epoch 175/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 176/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1769 - lr: 0.0010 Epoch 177/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1764 - lr: 0.0010 Epoch 178/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1763 - lr: 0.0010 Epoch 179/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1776 - val_loss: 0.1772 - lr: 0.0010 Epoch 180/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 181/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1770 - lr: 0.0010 Epoch 182/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010 Epoch 183/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1771 - lr: 0.0010 Epoch 184/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1767 - lr: 0.0010 Epoch 185/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1775 - lr: 0.0010 Epoch 186/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1763 - lr: 0.0010 Epoch 187/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1770 - lr: 0.0010 Epoch 188/1000 391/391 
[==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1766 - lr: 0.0010 Epoch 189/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1766 - lr: 0.0010 Epoch 190/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1769 - lr: 0.0010 Epoch 191/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 192/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 193/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1766 - lr: 0.0010 Epoch 194/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1765 - lr: 0.0010 Epoch 195/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1773 - lr: 0.0010 Epoch 196/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1769 - lr: 0.0010 Epoch 197/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1775 - val_loss: 0.1763 - lr: 0.0010 Epoch 198/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 199/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 200/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1764 - lr: 0.0010 Epoch 201/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1768 - lr: 0.0010 Epoch 202/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 203/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1768 - lr: 0.0010 Epoch 204/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 205/1000 391/391 
[==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 206/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1775 - lr: 0.0010 Epoch 207/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1765 - lr: 0.0010 Epoch 208/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1762 - lr: 0.0010 Epoch 209/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1765 - lr: 0.0010 Epoch 210/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1765 - lr: 0.0010 Epoch 211/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 212/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1774 - val_loss: 0.1784 - lr: 0.0010 Epoch 213/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1770 - lr: 0.0010 Epoch 214/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 215/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1768 - lr: 0.0010 Epoch 216/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1764 - lr: 0.0010 Epoch 217/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 218/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1768 - lr: 0.0010 Epoch 219/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 220/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1767 - lr: 0.0010 Epoch 221/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1771 - val_loss: 0.1764 - lr: 0.0010 Epoch 222/1000 391/391 
[==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1761 - lr: 0.0010 Epoch 223/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 224/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1767 - lr: 0.0010 Epoch 225/1000 391/391 [==============================] - 2s 5ms/step - loss: 0.1773 - val_loss: 0.1764 - lr: 0.0010 Epoch 226/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 227/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 228/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 229/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1767 - lr: 0.0010 Epoch 230/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1773 - val_loss: 0.1767 - lr: 0.0010 Epoch 231/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1766 - lr: 0.0010 Epoch 232/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1774 - val_loss: 0.1773 - lr: 0.0010 Epoch 233/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1774 - val_loss: 0.1771 - lr: 0.0010 Epoch 234/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1775 - val_loss: 0.1766 - lr: 0.0010 Epoch 235/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 236/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 237/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 238/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 239/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 240/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 241/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1775 - val_loss: 0.1765 - lr: 0.0010 Epoch 242/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 243/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1771 - lr: 0.0010 Epoch 244/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1771 - lr: 0.0010 Epoch 245/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1766 - lr: 0.0010 Epoch 246/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 247/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1766 - lr: 0.0010 Epoch 248/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1769 - lr: 0.0010 Epoch 249/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 250/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 251/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 252/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 253/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1766 - lr: 0.0010 Epoch 254/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1763 - lr: 0.0010 Epoch 255/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1763 - lr: 0.0010 Epoch 256/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1774 - val_loss: 0.1770 - lr: 0.0010 Epoch 257/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1778 - lr: 0.0010 Epoch 258/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1764 - lr: 0.0010 Epoch 259/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1769 - lr: 0.0010 Epoch 260/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1760 - lr: 0.0010 Epoch 261/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1766 - lr: 0.0010 Epoch 262/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1773 - val_loss: 0.1766 - lr: 0.0010 Epoch 263/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 264/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 265/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 266/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1768 - lr: 0.0010 Epoch 267/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1769 - val_loss: 0.1761 - lr: 0.0010 Epoch 268/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1773 - val_loss: 0.1767 - lr: 0.0010 Epoch 269/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 270/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1760 - lr: 0.0010 Epoch 271/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1761 - lr: 0.0010 Epoch 272/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1773 - lr: 0.0010 Epoch 273/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1773 - val_loss: 0.1767 - lr: 0.0010 Epoch 274/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1774 - val_loss: 0.1763 - lr: 0.0010 Epoch 275/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 276/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1769 - val_loss: 0.1762 - lr: 0.0010 Epoch 277/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1768 - lr: 0.0010 Epoch 278/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1764 - lr: 0.0010 Epoch 279/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1778 - lr: 0.0010 Epoch 280/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 281/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 282/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1768 - lr: 0.0010 Epoch 283/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1761 - lr: 0.0010 Epoch 284/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1764 - lr: 0.0010 Epoch 285/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1761 - lr: 0.0010 Epoch 286/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1763 - lr: 0.0010 Epoch 287/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 288/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1774 - val_loss: 0.1767 - lr: 0.0010 Epoch 289/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1768 - lr: 0.0010 Epoch 290/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1766 - lr: 0.0010 Epoch 291/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 292/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 293/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1763 - lr: 0.0010 Epoch 294/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1764 - lr: 0.0010 Epoch 295/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1783 - lr: 0.0010 Epoch 296/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1770 - lr: 0.0010 Epoch 297/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1769 - val_loss: 0.1766 - lr: 0.0010 Epoch 298/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 299/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1765 - lr: 0.0010 Epoch 300/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1769 - val_loss: 0.1763 - lr: 0.0010 Epoch 301/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1766 - lr: 0.0010 Epoch 302/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1763 - lr: 0.0010 Epoch 303/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1760 - lr: 0.0010 Epoch 304/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1775 - lr: 0.0010 Epoch 305/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1770 - lr: 0.0010 Epoch 306/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1783 - lr: 0.0010 Epoch 307/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1796 - lr: 0.0010 Epoch 308/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1773 - lr: 0.0010 Epoch 309/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1766 - lr: 0.0010 Epoch 310/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1770 - lr: 0.0010 Epoch 311/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1766 - lr: 0.0010 Epoch 312/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1772 - lr: 0.0010 Epoch 313/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1767 - lr: 0.0010 Epoch 314/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1771 - val_loss: 0.1760 - lr: 0.0010 Epoch 315/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1773 - val_loss: 0.1764 - lr: 0.0010 Epoch 316/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1770 - lr: 0.0010 Epoch 317/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1766 - lr: 0.0010 Epoch 318/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1766 - lr: 0.0010 Epoch 319/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 320/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 321/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1769 - val_loss: 0.1761 - lr: 0.0010 Epoch 322/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1769 - val_loss: 0.1765 - lr: 0.0010 Epoch 323/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 324/1000 391/391 
[==============================] - 2s 6ms/step - loss: 0.1772 - val_loss: 0.1764 - lr: 0.0010 Epoch 325/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1768 - lr: 0.0010 Epoch 326/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1771 - lr: 0.0010 Epoch 327/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1770 - val_loss: 0.1773 - lr: 0.0010 Epoch 328/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 329/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1769 - val_loss: 0.1765 - lr: 0.0010 Epoch 330/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 331/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1769 - val_loss: 0.1762 - lr: 0.0010 Epoch 332/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1765 - lr: 0.0010 Epoch 333/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1769 - val_loss: 0.1770 - lr: 0.0010 Epoch 334/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1770 - val_loss: 0.1771 - lr: 0.0010 Epoch 335/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1774 - lr: 0.0010 Epoch 336/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1767 - lr: 0.0010 Epoch 337/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1769 - val_loss: 0.1763 - lr: 0.0010 Epoch 338/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1772 - val_loss: 0.1765 - lr: 0.0010 Epoch 339/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1770 - val_loss: 0.1776 - lr: 0.0010 Epoch 340/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1769 - val_loss: 0.1761 - lr: 0.0010 Epoch 341/1000 391/391 
[==============================] - 3s 7ms/step - loss: 0.1768 - val_loss: 0.1765 - lr: 0.0010 Epoch 342/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1771 - val_loss: 0.1767 - lr: 0.0010 Epoch 343/1000 391/391 [==============================] - 2s 6ms/step - loss: 0.1769 - val_loss: 0.1764 - lr: 0.0010 Epoch 344/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1770 - val_loss: 0.1764 - lr: 0.0010 Epoch 345/1000 391/391 [==============================] - 3s 6ms/step - loss: 0.1769 - val_loss: 0.1766 - lr: 0.0010 Epoch 346/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1769 - val_loss: 0.1764 - lr: 0.0010 Epoch 347/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1770 - val_loss: 0.1761 - lr: 0.0010 Epoch 348/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1769 - val_loss: 0.1767 - lr: 0.0010 Epoch 349/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1768 - val_loss: 0.1762 - lr: 0.0010 Epoch 350/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1768 - val_loss: 0.1784 - lr: 0.0010 Epoch 351/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1771 - val_loss: 0.1765 - lr: 0.0010 Epoch 352/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1768 - val_loss: 0.1761 - lr: 0.0010 Epoch 353/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1771 - val_loss: 0.1768 - lr: 0.0010 Epoch 354/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1771 - val_loss: 0.1764 - lr: 0.0010 Epoch 355/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1768 - val_loss: 0.1761 - lr: 0.0010 Epoch 356/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1770 - val_loss: 0.1764 - lr: 0.0010 Epoch 357/1000 391/391 [==============================] - 3s 7ms/step - loss: 0.1771 - val_loss: 0.1762 - lr: 0.0010 Epoch 358/1000 391/391 
[Training log condensed: over epochs 359–696 (391/391 batches per epoch, lr held at 0.0010) the training loss plateaued at ≈ 0.1764–0.1772 and the validation loss at ≈ 0.1759–0.1780, with one spike to 0.1859 at epoch 385, while per-step time drifted from ~7 ms to ~13 ms.]
0.1766 - lr: 0.0010 Epoch 697/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 698/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1768 - lr: 0.0010 Epoch 699/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1772 - lr: 0.0010 Epoch 700/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1768 - lr: 0.0010 Epoch 701/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1764 - lr: 0.0010 Epoch 702/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1769 - lr: 0.0010 Epoch 703/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 704/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 705/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 706/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1769 - lr: 0.0010 Epoch 707/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1760 - lr: 0.0010 Epoch 708/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1764 - lr: 0.0010 Epoch 709/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 710/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 711/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 712/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1764 - lr: 0.0010 Epoch 713/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - 
val_loss: 0.1761 - lr: 0.0010 Epoch 714/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1767 - lr: 0.0010 Epoch 715/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 716/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1769 - lr: 0.0010 Epoch 717/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1766 - val_loss: 0.1767 - lr: 0.0010 Epoch 718/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1781 - lr: 0.0010 Epoch 719/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1769 - lr: 0.0010 Epoch 720/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1766 - val_loss: 0.1781 - lr: 0.0010 Epoch 721/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 722/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 723/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 724/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1765 - lr: 0.0010 Epoch 725/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1762 - lr: 0.0010 Epoch 726/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1762 - lr: 0.0010 Epoch 727/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 728/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1760 - lr: 0.0010 Epoch 729/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 730/1000 391/391 [==============================] - 5s 14ms/step - loss: 
0.1769 - val_loss: 0.1765 - lr: 0.0010 Epoch 731/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 732/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1764 - lr: 0.0010 Epoch 733/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1766 - lr: 0.0010 Epoch 734/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1763 - lr: 0.0010 Epoch 735/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1765 - val_loss: 0.1765 - lr: 0.0010 Epoch 736/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 737/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 738/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 739/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1761 - lr: 0.0010 Epoch 740/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 741/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 742/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1776 - lr: 0.0010 Epoch 743/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1759 - lr: 0.0010 Epoch 744/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1763 - lr: 0.0010 Epoch 745/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1760 - lr: 0.0010 Epoch 746/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1765 - val_loss: 0.1765 - lr: 0.0010 Epoch 747/1000 391/391 [==============================] - 5s 14ms/step - 
loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 748/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 749/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1766 - val_loss: 0.1767 - lr: 0.0010 Epoch 750/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 751/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1769 - lr: 0.0010 Epoch 752/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1770 - lr: 0.0010 Epoch 753/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1764 - lr: 0.0010 Epoch 754/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1767 - lr: 0.0010 Epoch 755/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1764 - lr: 0.0010 Epoch 756/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 757/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 758/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 759/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1765 - val_loss: 0.1761 - lr: 0.0010 Epoch 760/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1771 - lr: 0.0010 Epoch 761/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1768 - lr: 0.0010 Epoch 762/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 763/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 764/1000 391/391 [==============================] - 5s 
13ms/step - loss: 0.1762 - val_loss: 0.1761 - lr: 0.0010 Epoch 765/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1768 - lr: 0.0010 Epoch 766/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 767/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 768/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 769/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1763 - lr: 0.0010 Epoch 770/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 771/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1775 - lr: 0.0010 Epoch 772/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 773/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1759 - lr: 0.0010 Epoch 774/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 775/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 776/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1762 - val_loss: 0.1767 - lr: 0.0010 Epoch 777/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 778/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1760 - lr: 0.0010 Epoch 779/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1761 - lr: 0.0010 Epoch 780/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 781/1000 391/391 
[==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1772 - lr: 0.0010 Epoch 782/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 783/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 784/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1760 - lr: 0.0010 Epoch 785/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1760 - lr: 0.0010 Epoch 786/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1768 - lr: 0.0010 Epoch 787/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1767 - lr: 0.0010 Epoch 788/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1766 - lr: 0.0010 Epoch 789/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1759 - lr: 0.0010 Epoch 790/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 791/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1761 - lr: 0.0010 Epoch 792/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 793/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 794/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 795/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 796/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1767 - lr: 0.0010 Epoch 797/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 798/1000 
391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 799/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 800/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1762 - lr: 0.0010 Epoch 801/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1788 - lr: 0.0010 Epoch 802/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1765 - lr: 0.0010 Epoch 803/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1759 - lr: 0.0010 Epoch 804/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1760 - lr: 0.0010 Epoch 805/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1762 - lr: 0.0010 Epoch 806/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1770 - lr: 0.0010 Epoch 807/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1777 - lr: 0.0010 Epoch 808/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1766 - val_loss: 0.1769 - lr: 0.0010 Epoch 809/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1765 - val_loss: 0.1770 - lr: 0.0010 Epoch 810/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1772 - lr: 0.0010 Epoch 811/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 812/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1763 - lr: 0.0010 Epoch 813/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1768 - lr: 0.0010 Epoch 814/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1763 - lr: 0.0010 Epoch 
815/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 816/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 817/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 818/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1765 - lr: 0.0010 Epoch 819/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1764 - lr: 0.0010 Epoch 820/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1764 - lr: 0.0010 Epoch 821/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 822/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1764 - lr: 0.0010 Epoch 823/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 824/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1763 - lr: 0.0010 Epoch 825/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 826/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1761 - val_loss: 0.1759 - lr: 0.0010 Epoch 827/1000 391/391 [==============================] - 5s 13ms/step - loss: 0.1764 - val_loss: 0.1785 - lr: 0.0010 Epoch 828/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1774 - lr: 0.0010 Epoch 829/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 830/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 831/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1768 - lr: 0.0010 
Epoch 832/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 833/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 834/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 835/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1764 - lr: 0.0010 Epoch 836/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1769 - lr: 0.0010 Epoch 837/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 838/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1766 - lr: 0.0010 Epoch 839/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 840/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1772 - lr: 0.0010 Epoch 841/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1760 - lr: 0.0010 Epoch 842/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1762 - lr: 0.0010 Epoch 843/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 844/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1783 - lr: 0.0010 Epoch 845/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 846/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 847/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1761 - lr: 0.0010 Epoch 848/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1761 - lr: 
0.0010 Epoch 849/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1761 - lr: 0.0010 Epoch 850/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1766 - lr: 0.0010 Epoch 851/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 852/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1772 - lr: 0.0010 Epoch 853/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1759 - lr: 0.0010 Epoch 854/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 855/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 856/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1771 - lr: 0.0010 Epoch 857/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1768 - lr: 0.0010 Epoch 858/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1767 - lr: 0.0010 Epoch 859/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 860/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1764 - val_loss: 0.1760 - lr: 0.0010 Epoch 861/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1766 - lr: 0.0010 Epoch 862/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 863/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1771 - lr: 0.0010 Epoch 864/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1767 - lr: 0.0010 Epoch 865/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1763 
- lr: 0.0010 Epoch 866/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 867/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1765 - lr: 0.0010 Epoch 868/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1761 - val_loss: 0.1760 - lr: 0.0010 Epoch 869/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1765 - lr: 0.0010 Epoch 870/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 871/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 872/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1771 - lr: 0.0010 Epoch 873/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1763 - lr: 0.0010 Epoch 874/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1761 - lr: 0.0010 Epoch 875/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1763 - val_loss: 0.1760 - lr: 0.0010 Epoch 876/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1764 - val_loss: 0.1762 - lr: 0.0010 Epoch 877/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1761 - lr: 0.0010 Epoch 878/1000 391/391 [==============================] - 5s 14ms/step - loss: 0.1762 - val_loss: 0.1789 - lr: 0.0010 Epoch 879/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1766 - val_loss: 0.1766 - lr: 0.0010 Epoch 880/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1768 - lr: 0.0010 Epoch 881/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1761 - lr: 0.0010 Epoch 882/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 
0.1777 - lr: 0.0010 Epoch 883/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1768 - lr: 0.0010 Epoch 884/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1769 - lr: 0.0010 Epoch 885/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 886/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1765 - lr: 0.0010 Epoch 887/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1764 - val_loss: 0.1779 - lr: 0.0010 Epoch 888/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1771 - lr: 0.0010 Epoch 889/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1765 - lr: 0.0010 Epoch 890/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1762 - lr: 0.0010 Epoch 891/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1765 - lr: 0.0010 Epoch 892/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 893/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 894/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1771 - lr: 0.0010 Epoch 895/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 896/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1767 - lr: 0.0010 Epoch 897/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1769 - lr: 0.0010 Epoch 898/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 899/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - 
val_loss: 0.1765 - lr: 0.0010 Epoch 900/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1761 - lr: 0.0010 Epoch 901/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1760 - val_loss: 0.1759 - lr: 0.0010 Epoch 902/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1771 - lr: 0.0010 Epoch 903/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1763 - val_loss: 0.1763 - lr: 0.0010 Epoch 904/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1762 - val_loss: 0.1766 - lr: 0.0010 Epoch 905/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1767 - lr: 0.0010 Epoch 906/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1764 - val_loss: 0.1766 - lr: 0.0010 Epoch 907/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1760 - lr: 0.0010 Epoch 908/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1758 - lr: 0.0010 Epoch 909/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 910/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1759 - lr: 0.0010 Epoch 911/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1765 - lr: 0.0010 Epoch 912/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1763 - val_loss: 0.1767 - lr: 0.0010 Epoch 913/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1760 - val_loss: 0.1759 - lr: 0.0010 Epoch 914/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1767 - lr: 0.0010 Epoch 915/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1763 - lr: 0.0010 Epoch 916/1000 391/391 [==============================] - 6s 15ms/step - loss: 
0.1761 - val_loss: 0.1763 - lr: 0.0010 Epoch 917/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1763 - val_loss: 0.1762 - lr: 0.0010 Epoch 918/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1777 - lr: 0.0010 Epoch 919/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1763 - val_loss: 0.1772 - lr: 0.0010 Epoch 920/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1766 - lr: 0.0010 Epoch 921/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1763 - lr: 0.0010 Epoch 922/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1760 - lr: 0.0010 Epoch 923/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1765 - lr: 0.0010 Epoch 924/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1762 - lr: 0.0010 Epoch 925/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1764 - val_loss: 0.1764 - lr: 0.0010 Epoch 926/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1762 - lr: 0.0010 Epoch 927/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1762 - val_loss: 0.1767 - lr: 0.0010 Epoch 928/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1763 - val_loss: 0.1761 - lr: 0.0010 Epoch 929/1000 391/391 [==============================] - 6s 15ms/step - loss: 0.1761 - val_loss: 0.1760 - lr: 0.0010 Epoch 930/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1763 - val_loss: 0.1764 - lr: 0.0010 Epoch 931/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1773 - lr: 0.0010 Epoch 932/1000 391/391 [==============================] - 6s 14ms/step - loss: 0.1761 - val_loss: 0.1765 - lr: 0.0010 Epoch 933/1000 391/391 [==============================] - 6s 14ms/step - 
[Training log truncated — epochs 934–1000 of 1000, 391 batches/epoch, ≈6 s/epoch. Training and validation losses plateau around 0.176; the learning rate is reduced from 1.0e-3 to 5.0e-4 at epoch 981, after which the losses settle near 0.1753 (train) and 0.1757 (validation).]
Training MSE: 0.17490358650684357 Testing MSE: 0.1765994280576706
Notes:
It can be observed from the above graph that the training and validation losses have converged. However, the validation loss fluctuates noticeably more than the training loss. A likely cause is the relatively large batch size: batch sizes of 128 and above typically produce higher fluctuations in the validation loss. Nevertheless, the neural network has been trained to recover the Heston model parameters for given market inputs.
The trained Keras model is saved to the file 'heston_model_calibration.h5' and then loaded from it, so that it can be used for predictions in the next part of the project.
from tensorflow.keras.models import load_model
# Load the model
model = load_model('heston_model_calibration.h5')
WARNING:absl:Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model. WARNING:absl:Error in loading the saved optimizer state. As a result, your model is starting with a freshly initialized optimizer.
Part 3 - Calibrating the model parameters¶
3.1 Calibration with testing data¶
- In this case, the trained model is used to recover the parameters for 10 sample sets. Since the calibration is on testing data, there is a reference value for each output parameter.
- This was done as an exercise to determine whether there were any significant deviations between the modeled and actual parameters, and to look for any consistent behavior among the values of the different parameters.
- It also served as a precursor to calibration with market data.
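The per-parameter error check described above can be sketched in isolation. The arrays below are illustrative stand-ins for one test case, not actual model outputs:

```python
import numpy as np

# Hypothetical modeled vs. actual Heston parameters for one test case,
# in the order [kappa, theta, v0, sigma, rho] (values are made up)
predicted = np.array([1.65, 0.058, 0.124, 0.392, -0.424])
actual = np.array([1.18, 0.051, 0.110, 0.033, -0.090])

# Element-wise absolute deviation, the quantity tabulated per test case
abs_diff = np.abs(predicted - actual)
print(abs_diff)
```

The same vectorised subtraction is what makes the per-case comparison cheap to repeat over many samples.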
import numpy as np
import matplotlib.pyplot as plt
# Use the model to predict the outputs for 10 random inputs from the test set
num_test_cases = 10
test_indices = np.random.choice(len(X_test), num_test_cases, replace=False)
selected_inputs = X_test[test_indices]
selected_actual_outputs = y_test[test_indices]
# Initialize lists to store absolute differences for 5 parameters
kappa_diff = []
theta_diff = []
sigma_diff = []
v_0_diff = []
rho_diff = []
# Parameter names for display
param_names = ['Kappa', 'Theta', 'V0', 'Sigma', 'Rho']
# Iterate over each test case and make predictions
for i, (input_data, actual_params) in enumerate(zip(selected_inputs, selected_actual_outputs)):
    # Predict the parameters using the model
    predicted_params = model.predict(np.expand_dims(input_data, axis=0))
    # Calculate the absolute differences
    differences = np.abs(predicted_params[0] - actual_params)
    kappa_diff.append(differences[0])
    theta_diff.append(differences[1])
    v_0_diff.append(differences[2])
    sigma_diff.append(differences[3])
    rho_diff.append(differences[4])
    # Print the inputs that were fed into the model
    print(f"Test Case {i+1}:")
    print(f"{'Option Price':<15}{input_data[0]:<20.10f}")
    print(f"{'S0':<15}{input_data[1]:<20.10f}")
    print(f"{'K':<15}{input_data[2]:<20.10f}")
    print(f"{'Tau':<15}{input_data[3]:<20.10f}")
    print(f"{'R':<15}{input_data[4]:<20.10f}")
    print(f"{'Lambd':<15}{input_data[5]:<20.10f}\n")
    # Print the results in side-by-side columns
    print(f"{'Parameter':<15}{'Modeled Parameters':<20}{'Actual Parameters':<20}{'Absolute Difference':<20}")
    for j, param_name in enumerate(param_names):
        print(f"{param_name:<15}{predicted_params[0][j]:<20.10f}{actual_params[j]:<20.10f}{differences[j]:<20.10f}")
    print("\n" + "="*60 + "\n")
# Plot the absolute differences
plt.figure(figsize=(10, 6))
plt.plot(range(1, num_test_cases + 1), kappa_diff, marker='o', linestyle='-', color='blue', label='Kappa')
plt.plot(range(1, num_test_cases + 1), theta_diff, marker='o', linestyle='-', color='green', label='Theta')
plt.plot(range(1, num_test_cases + 1), v_0_diff, marker='o', linestyle='-', color='orange', label='V_0')
plt.plot(range(1, num_test_cases + 1), sigma_diff, marker='o', linestyle='-', color='red', label='Sigma')
plt.plot(range(1, num_test_cases + 1), rho_diff, marker='o', linestyle='-', color='pink', label='Rho')
plt.xlabel('Test Case')
plt.ylabel('Absolute Difference')
plt.title('Absolute Differences of the parameters')
plt.xticks(range(1, num_test_cases + 1))
plt.legend()
plt.tight_layout()
plt.show()
[Per-case `model.predict` progress bars omitted from the output below.]

Test Case 1:
Option Price   105.2423050486
S0             376.9348978971
K              296.3284773215
Tau            1.6650837435
R              0.0200732711
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.6475424767        1.1814494995        0.4660929771
Theta          0.0579937845        0.0513167328        0.0066770517
V0             0.1242846251        0.1097057807        0.0145788443
Sigma          0.3919076622        0.0330734408        0.3588342214
Rho            -0.4237608612       -0.0903668959       0.3333939653
============================================================
Test Case 2:
Option Price   149.6331726627
S0             223.9368273237
K              82.2068329888
Tau            2.1682419397
R              0.0200136289
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.4554615021        0.7688132473        0.6866482548
Theta          0.2159464806        0.3853035370        0.1693570563
V0             0.2515009642        0.1860554778        0.0654454863
Sigma          0.3966646791        0.0493953472        0.3472693318
Rho            -0.4423892498       -0.3402541986       0.1021350512
============================================================
Test Case 3:
Option Price   164.5326370487
S0             436.4110955260
K              289.5098557607
Tau            0.4736979686
R              0.0228529739
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.3690667152        2.6425523959        1.2734856806
Theta          0.3182166219        0.3018385094        0.0163781124
V0             0.4087444544        0.4808234809        0.0720790265
Sigma          0.4504736066        0.6632664137        0.2127928071
Rho            -0.4944992065       -0.2800568526       0.2144423539
============================================================
Test Case 4:
Option Price   17.7253075849
S0             140.3179958215
K              176.5797478963
Tau            0.9319856252
R              0.0357788898
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.5589693785        2.0223679765        0.4633985980
Theta          0.2819414437        0.1874962541        0.0944451896
V0             0.3006759882        0.4725380311        0.1718620429
Sigma          0.4110376239        0.6365931351        0.2255555112
Rho            -0.4443227053       -0.5753449004       0.1310221951
============================================================
Test Case 5:
Option Price   91.9127644032
S0             269.7417453566
K              226.0097104565
Tau            2.6011470589
R              0.0216730200
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.4249286652        1.9264390106        0.5015103454
Theta          0.1514614969        0.0781743163        0.0732871806
V0             0.2408414930        0.4749482580        0.2341067650
Sigma          0.4122843742        0.4036332025        0.0086511717
Rho            -0.4542059302       -0.5867712844       0.1325653542
============================================================
Test Case 6:
Option Price   173.4950498841
S0             212.1168783995
K              39.8783723913
Tau            1.4222794983
R              0.0222552968
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.5340061188        1.6606876948        0.1266815760
Theta          0.2366018444        0.0309438630        0.2056579814
V0             0.2604775429        0.1289011470        0.1315763959
Sigma          0.3784299791        0.7723914710        0.3939614919
Rho            -0.4334522188       -0.1966567478       0.2367954710
============================================================
Test Case 7:
Option Price   114.2777411957
S0             343.1882280669
K              294.3550946071
Tau            2.0433108693
R              0.0350867562
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.3913313150        2.0821235047        0.6907921897
Theta          0.1946085691        0.1884051285        0.0062034407
V0             0.2561624348        0.1839066841        0.0722557507
Sigma          0.4223381877        0.1973205099        0.2250176778
Rho            -0.4628173113       -0.6551824195       0.1923651082
============================================================
Test Case 8:
Option Price   12.3689164612
S0             73.2723140903
K              106.1839038760
Tau            1.9887088021
R              0.0427735662
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.3703595400        2.1139794384        0.7436198985
Theta          0.2375463992        0.2309369970        0.0066094023
V0             0.2731899023        0.2279273339        0.0452625684
Sigma          0.4162165523        0.0742237921        0.3419927602
Rho            -0.4577493370       -0.0768029503       0.3809463866
============================================================
Test Case 9:
Option Price   5.5008322173
S0             55.3402712061
K              62.2723762740
Tau            1.3005634522
R              0.0201406044
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.3361787796        2.1609021617        0.8247233821
Theta          0.1436918080        0.0769776484        0.0667141596
V0             0.1472416073        0.1627801207        0.0155385134
Sigma          0.4485429823        0.7610300668        0.3124870844
Rho            -0.4658103883       -0.1903731056       0.2754372828
============================================================
Test Case 10:
Option Price   0.0777750035
S0             62.6451872035
K              260.7953172426
Tau            1.3252884917
R              0.0343894517
Lambd          0.0000000000

Parameter      Modeled Parameters  Actual Parameters   Absolute Difference
Kappa          1.4510521889        0.6069127799        0.8441394090
Theta          0.1796740144        0.2231289889        0.0434549744
V0             0.2360714525        0.2441425695        0.0080711170
Sigma          0.4223446846        0.5892456525        0.1669009679
Rho            -0.5016000867       -0.3965120838       0.1050880029
============================================================
- Note: Theta appears to be the best-estimated parameter, while kappa is the worst. The modeled values also cluster in a narrow range for each parameter (e.g. kappa around 1.4 and rho around -0.45), so the largest deviations occur where the true value is extreme. However, it would be premature to draw firm conclusions from an exercise covering only 10 samples.
3.2 Calibration with market data¶
Market data:
- Source - Yahoo Finance
- Risk-free rate - yield on the 13-week US Treasury bill (^IRX)
- Options - AAPL call options
The experiments were done for 12 available maturity dates with 20 strike prices each. The algorithm involves the following steps:
- Feed the current option price, spot price, risk-free rate, time to maturity, strike price and lambda into the trained network to obtain the modeled Heston parameters.
- Using these modeled parameters and the same inputs (except maturity, which is reduced by 1 day), calculate the price with the Heston formula. This calculated price is "tomorrow's price" for that option.
- Calculate the deviation between the current price and the predicted "tomorrow's price".
Since the inputs differ only by 1 day of time to maturity, ideally there should not be much deviation from the current price. However, the results need not be consistent across maturities; it is more likely that each maturity will have its own curve and fit the model differently.
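The one-day shift described above is just an adjustment of the year-fraction input to the pricer; a minimal sketch, where the 30-day expiry is illustrative and the ACT/365 day count matches the calibration code:

```python
# Time to maturity in years for an option expiring in 30 calendar days
days_to_expiry = 30
tau = days_to_expiry / 365        # today's time to maturity
tau_next_day = tau - 1 / 365      # "tomorrow's" time to maturity

# The Heston repricing uses tau_next_day with otherwise identical inputs,
# so the deviation reflects one day of time decay plus model error
print(tau, tau_next_day)
```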
Results are visualized using matplotlib. The first figure shows the deviation curves for the 12 maturities in separate panels; since the scales differ across maturities, a second figure overlays all the curves so their magnitudes can be compared directly.
At the end, a table gives a comprehensive analysis of the deviations for the different maturities: the MSE, MAE, minimum absolute deviation, maximum absolute deviation, median absolute deviation, and the strike price at which the absolute deviation from the current price is smallest.
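The per-maturity statistics in that table can be reproduced from a deviation vector with plain NumPy; a sketch with made-up deviations and strikes (not actual results):

```python
import numpy as np

# Hypothetical per-strike deviations (predicted "tomorrow" price minus
# current price) for one maturity; values are illustrative only
deviations = np.array([0.8, -1.2, 0.3, 2.5, -0.4])
strikes = np.array([180.0, 185.0, 190.0, 195.0, 200.0])

mse = np.mean(deviations ** 2)       # mean squared deviation from zero
mae = np.mean(np.abs(deviations))    # mean absolute deviation from zero
abs_devs = np.abs(deviations)
min_dev, max_dev = abs_devs.min(), abs_devs.max()
median_dev = np.median(abs_devs)
strike_min_dev = strikes[np.argmin(abs_devs)]  # best-fitting strike

print(mse, mae, min_dev, max_dev, median_dev, strike_min_dev)
```

Comparing against a zero vector with `mean_squared_error`, as the calibration code does, gives the same numbers as squaring and averaging directly.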
import yfinance as yf
import numpy as np
import matplotlib.pyplot as plt
from datetime import datetime, timedelta
from sklearn.metrics import mean_squared_error, mean_absolute_error
import pandas as pd
# Fetch AAPL data
aapl_ticker = 'AAPL'
aapl_stock = yf.Ticker(aapl_ticker)
# Get the current spot price
spot_price = aapl_stock.history(period='1d')['Close'].iloc[-1]
# Fetch available option expiration dates
available_expirations = [datetime.strptime(date, "%Y-%m-%d").date() for date in aapl_stock.options]
def get_nearest_maturity(target_date):
    """Find the nearest maturity date to the target_date."""
    nearest_date = min(available_expirations, key=lambda d: abs(d - target_date))
    return nearest_date

def get_option_data(maturity):
    """Fetch options data for the given maturity."""
    options_data = aapl_stock.option_chain(maturity.strftime('%Y-%m-%d'))
    return options_data.calls

def calculate_tau(maturity_date, current_date):
    """Calculate the exact time to maturity in years."""
    tau = (maturity_date - current_date).days / 365
    return tau

def calibrate_options(calls, tau, rf):
    """Calibrate the options and calculate deviations."""
    n_strikes = min(20, len(calls))
    deviations = []
    strike_prices = []
    for i in range(n_strikes):
        option_price = calls.iloc[i]['lastPrice']
        strike_price = calls.iloc[i]['strike']
        strike_prices.append(strike_price)
        # Construct the input array
        input_data = np.array([[option_price, spot_price, strike_price, tau, rf, lambd]])
        # Predict the parameters using the trained model
        predicted_params = model.predict(input_data)
        # Predict the option price for one day less
        tau_next_day = tau - 1/365
        heston_model = Heston(spot_price, strike_price, tau_next_day, rf, predicted_params[0][0],
                              predicted_params[0][1], predicted_params[0][2], lambd, predicted_params[0][3],
                              predicted_params[0][4])
        heston_price_next_day = heston_model.price(0.00001, 100, 10000)
        # Compare with the current price
        difference = option_price - heston_price_next_day
        deviations.append(difference)
    return strike_prices, deviations
# Fetch risk-free rate from ^IRX
irx_ticker = '^IRX'
irx_data = yf.Ticker(irx_ticker)
rf = irx_data.history(period='1d')['Close'].iloc[-1] / 100 # Convert percentage to decimal
# Define other parameters
lambd = 0
# Prepare for analysis
analysis_results = []
import matplotlib as mpl
colors = mpl.colormaps['tab10'].resampled(12)  # replaces deprecated plt.cm.get_cmap
# Generate target maturity dates (1 to 12 weeks from now)
current_date = datetime.now().date()
# Use the first 12 available expiration dates directly
selected_maturities = available_expirations[:12]
# Create dictionaries to store results for plotting
maturity_strike_prices = {}
maturity_deviations = {}
# Set up a 4x3 grid for the subplots
fig, axs = plt.subplots(4, 3, figsize=(18, 24))
fig.suptitle('Deviation from Actual Option Price for Different Maturities', fontsize=16)
axs = axs.flatten()
# Iterate through the maturities, plot separately, and analyze deviations
for i, maturity_date in enumerate(selected_maturities):
    print(f"Calibrating for maturity date: {maturity_date}")
    calls = get_option_data(maturity_date)
    tau = calculate_tau(maturity_date, current_date)
    strike_prices, deviations = calibrate_options(calls, tau, rf)
    # Store results for combined plotting
    maturity_strike_prices[maturity_date] = strike_prices
    maturity_deviations[maturity_date] = deviations
    # Plotting the deviations for each maturity in a grid
    axs[i].plot(strike_prices, deviations, marker='o', linestyle='-',
                color=colors(i), label=f'{(maturity_date - current_date).days // 7}-Weeks Maturity ({maturity_date})')
    axs[i].set_xlabel('Strike Price')
    axs[i].set_ylabel('Deviation')
    axs[i].set_title(f'Maturity: {(maturity_date - current_date).days // 7} Weeks')
    axs[i].legend()
# Plotting all deviations together on one graph with color coding
plt.figure(figsize=(12, 8))
for i, (maturity_date, deviations) in enumerate(maturity_deviations.items()):
    plt.plot(maturity_strike_prices[maturity_date], deviations, 'o-', color=colors(i),
             label=f'{(maturity_date - current_date).days // 7}-Weeks Maturity')
plt.xlabel('Strike Price')
plt.ylabel('Deviation from Actual Option Price')
plt.title('Deviation from Actual Option Price for All Maturities')
plt.legend()
plt.tight_layout()
plt.show()
# Analysis
for maturity_date in selected_maturities:
    calls = get_option_data(maturity_date)
    tau = calculate_tau(maturity_date, current_date)
    strike_prices, deviations = calibrate_options(calls, tau, rf)
    # Analysis: MSE, MAE, Min, Max, Median (absolute values for min, max, median)
    mse = mean_squared_error(np.zeros(len(deviations)), deviations)
    mae = mean_absolute_error(np.zeros(len(deviations)), deviations)
    abs_devs = np.abs(deviations)
    min_dev = np.min(abs_devs)
    max_dev = np.max(abs_devs)
    median_dev = np.median(abs_devs)
    strike_min_dev = strike_prices[np.argmin(abs_devs)]
    # Store results in a list for the table
    analysis_results.append([f'{(maturity_date - current_date).days // 7}-Weeks ({maturity_date})', mse, mae, min_dev, max_dev, median_dev, strike_min_dev])
# Create a DataFrame for the analysis results and display it
df_results = pd.DataFrame(analysis_results, columns=['Maturity', 'MSE', 'MAE', 'Min Deviation (Abs)', 'Max Deviation (Abs)', 'Median Deviation (Abs)', 'Strike for Min Deviation'])
print("\nAnalysis Results:\n")
print(df_results.to_string(index=False))
# Find the maturity with the least MSE
best_fit_maturity = df_results.iloc[df_results['MSE'].idxmin()]['Maturity']
print(f"\nThe best fit maturity (least MSE) is: {best_fit_maturity}")
# Analysis for the most number of strikes with absolute deviation less than 1
deviation_threshold = 1
strikes_below_threshold = {}
for maturity_date in selected_maturities:
    strikes_below_threshold[maturity_date] = np.sum(np.abs(maturity_deviations[maturity_date]) < deviation_threshold)
# Find the maturity with the most number of strikes having deviations less than the threshold
most_strikes_maturity = max(strikes_below_threshold, key=strikes_below_threshold.get)
most_strikes_count = strikes_below_threshold[most_strikes_maturity]
print("\nMaturity Analysis for Deviations Less Than 1:\n")
print(f"Maturity with the most number of strikes having deviation < {deviation_threshold} is: {most_strikes_maturity}")
print(f"Number of such strikes: {most_strikes_count}")
[Per-strike `model.predict` progress bars omitted. Calibration runs for maturity dates: 2024-08-30, 2024-09-06, 2024-09-13, 2024-09-20, 2024-09-27, 2024-10-18, 2024-11-15, 2024-12-20, 2025-01-17, 2025-02-21, 2025-03-21, 2025-04-17.]
25ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 33ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 33ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 33ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 11ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 
━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 12ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 12ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 
Analysis Results:

| Maturity | MSE | MAE | Min Deviation (Abs) | Max Deviation (Abs) | Median Deviation (Abs) | Strike for Min Deviation |
|---|---|---|---|---|---|---|
| 0-Weeks (2024-08-30) | 4.329663 | 0.873515 | 0.009159 | 8.408184 | 0.273526 | 145.0 |
| 1-Weeks (2024-09-06) | 357.550310 | 6.054945 | 0.009048 | 83.430254 | 0.704855 | 175.0 |
| 2-Weeks (2024-09-13) | 39.660806 | 3.796684 | 0.070485 | 18.125658 | 1.187716 | 205.0 |
| 3-Weeks (2024-09-20) | 23.968630 | 2.638739 | 0.191484 | 12.349036 | 0.954321 | 15.0 |
| 4-Weeks (2024-09-27) | 13.486128 | 1.915251 | 0.074233 | 12.686789 | 1.008306 | 210.0 |
| 7-Weeks (2024-10-18) | 270.120416 | 8.589418 | 0.187379 | 51.747649 | 2.321916 | 100.0 |
| 11-Weeks (2024-11-15) | 471.416127 | 13.299994 | 0.079561 | 56.703578 | 5.869422 | 145.0 |
| 16-Weeks (2024-12-20) | 457.227686 | 14.333987 | 0.481247 | 51.413859 | 8.180991 | 140.0 |
| 20-Weeks (2025-01-17) | 162.209292 | 8.504111 | 0.053214 | 31.434685 | 3.969100 | 80.0 |
| 25-Weeks (2025-02-21) | 3.946623 | 1.312871 | 0.034744 | 5.083989 | 0.678446 | 200.0 |
| 29-Weeks (2025-03-21) | 116.209557 | 7.881335 | 0.034487 | 26.464662 | 7.281615 | 135.0 |
| 33-Weeks (2025-04-17) | 3.223457 | 1.456361 | 0.126437 | 3.141818 | 1.474096 | 210.0 |

The best fit maturity (least MSE) is: 33-Weeks (2025-04-17)

Maturity Analysis for Deviations Less Than 1: the maturity with the most strikes having deviation < 1 is 2024-08-30, with 18 such strikes.
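The per-maturity statistics reported above can be reproduced from arrays of predicted and market prices. A minimal sketch, using hypothetical data for a single maturity (the function name and inputs are illustrative, not the project's actual code):

```python
import numpy as np

def deviation_stats(strikes, market, predicted):
    """Per-maturity deviation statistics between predicted and market prices."""
    market = np.asarray(market, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    dev = np.abs(predicted - market)  # absolute deviation per strike
    return {
        "MSE": float(np.mean((predicted - market) ** 2)),
        "MAE": float(np.mean(dev)),
        "Min Deviation (Abs)": float(dev.min()),
        "Max Deviation (Abs)": float(dev.max()),
        "Median Deviation (Abs)": float(np.median(dev)),
        "Strike for Min Deviation": float(strikes[int(dev.argmin())]),
    }

# Hypothetical example: three strikes for one maturity
stats = deviation_stats([100.0, 105.0, 110.0],
                        market=[12.0, 9.5, 7.2],
                        predicted=[12.1, 9.0, 8.0])
```

Applying this function to each maturity's strike slice yields one row of the table.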
Notes on Results¶
While a lower MSE might indicate the best fit, it is interesting to note that the maturity with the most strikes whose deviation is under $1 need not be the one with the lowest MSE.
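The two selection criteria can disagree because MSE is dominated by a few large errors, whereas a count of sub-$1 deviations rewards many small ones. A small illustration with hypothetical deviation arrays for two maturities:

```python
import numpy as np

# Hypothetical absolute deviations (dollars) per strike, for two maturities
deviations = {
    "2024-08-30": np.array([0.1, 0.3, 0.5, 8.4]),   # many small errors, one large outlier
    "2025-04-17": np.array([1.2, 1.5, 1.4, 1.6]),   # uniformly moderate errors
}

mse = {m: float(np.mean(d ** 2)) for m, d in deviations.items()}
count_lt_1 = {m: int(np.sum(d < 1.0)) for m, d in deviations.items()}

best_by_mse = min(mse, key=mse.get)                  # lowest mean squared error
best_by_count = max(count_lt_1, key=count_lt_1.get)  # most sub-$1 deviations
```

Here the single 8.4-dollar outlier inflates the first maturity's MSE even though three of its four strikes are priced within $1, so the two criteria pick different maturities.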
It is evident from the graphs that the model's fit varies considerably across maturities and strike prices: some maturities show extreme fluctuations, while others stay within a deviation of about 10 dollars.
Furthermore, the table shows that the smallest of the minimum absolute deviations is below 1 cent, and even the largest of the minimum absolute deviations is below 50 cents, which is encouraging. However, for maturity date 2024-09-06, while the minimum absolute deviation is under 1 cent, it stands in stark contrast to the maximum absolute deviation of 83 dollars.
The median deviation is encouraging: it is below $2 for most maturities and below $10 for all of them. Looking at the strike-for-minimum-deviation column, the values range from 15 to 210, indicating that the model can predict fairly accurate prices across a wide range of strikes.
With regard to MSE, the best fit is for maturity date 2025-04-17. For maturity date 2024-08-30, however, the absolute deviation from the current price was under $1 for 18 out of 20 strikes, indicating that the model was fairly accurate at this short-term maturity. It is not consistent across the other short-term maturities, though, so overall the model is moderately accurate at best.
Given these contrasts and inconsistencies, there is scope for improving the model's generalization.
Concluding remarks¶
This project has highlighted that neural networks have significant potential for recovering the parameters of the Heston model from available market data. Their ability to learn complex, non-linear relationships between inputs (option prices, spot prices, strike prices, and other financial variables) and outputs (kappa, theta, sigma, rho, and the initial variance) makes them a valuable tool for both parameter recovery and prediction.
Although the numerical results of this project were only moderately accurate, the experience provided valuable insights about the possibilities of effectively using neural networks for model calibration.
Another valuable takeaway from this project is the impact of adding randomness to the training data. It was an encouraging realization that incorporating controlled noise into the training dataset can enhance the robustness and accuracy of the neural network: by simulating the variability inherent in real-world market data, the model generalizes better from the training data to unseen examples. This was borne out by the improved numerical results.
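The noise-injection idea can be sketched as follows. The multiplicative Gaussian scale below is an illustrative assumption, not the exact formula used in this project:

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for reproducibility

def add_price_noise(prices, rel_scale=0.01):
    """Perturb training option prices with zero-mean Gaussian noise whose
    standard deviation is proportional to each price (illustrative scheme),
    mimicking quote variability in real market data."""
    prices = np.asarray(prices, dtype=float)
    noise = rng.normal(0.0, rel_scale * np.abs(prices))
    # Option prices cannot be negative, so clip the perturbed values at zero
    return np.clip(prices + noise, 0.0, None)

noisy = add_price_noise([12.0, 9.5, 7.2])
```

Each training epoch (or each synthetic sample) can then draw a fresh perturbation, so the network never sees exactly the same price surface twice.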
The process of adding noise to the training data could be refined to replicate the stochastic nature of market behavior more accurately. In future work, it would be interesting to tune the noise formula toward proportions that reflect realistic market conditions while maintaining the balance between variability and model accuracy.
Along similar lines, future work could also experiment with various neural network architectures, training algorithms, and regularization techniques. Such experimentation can help identify the most effective approaches for recovering Heston model parameters, potentially leading to better generalization and improved prediction accuracy.